
Data Engineer

Company: Robert Half

Location: Ann Arbor, MI

Posted on: October 29

Our client is undergoing a major digital transformation, shifting toward a cloud-native, API-driven infrastructure. They're looking for a Data Engineer to help build a modern, scalable data platform that supports this evolution. The role will focus on creating secure, efficient data pipelines, preparing data for analytics, and enabling real-time data sharing across systems.

As the organization transitions from older, legacy systems to more dynamic, event-based and API-integrated models, the Data Engineer will be instrumental in modernizing the data environment, particularly across the bronze, silver, and gold layers of their medallion architecture.
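To illustrate the bronze/silver/gold flow mentioned above, here is a minimal Python sketch of medallion-style transformations. Plain lists and dicts stand in for Delta tables, and all function and field names (`ingest_bronze`, `to_silver`, `to_gold`, `id`, `amount`) are hypothetical examples, not part of the client's actual pipeline:

```python
# Illustrative medallion-architecture sketch: raw data lands in bronze,
# is cleaned and deduplicated into silver, then aggregated into gold.
# Plain Python structures stand in for Delta tables.

def ingest_bronze(raw_records):
    """Bronze layer: land raw records as-is, tagged with a layer marker."""
    return [{**r, "_layer": "bronze"} for r in raw_records]

def to_silver(bronze):
    """Silver layer: validate required fields and deduplicate on 'id'."""
    seen, silver = set(), []
    for r in bronze:
        if r.get("id") is None or r.get("amount") is None:
            continue  # validation: drop records missing required fields
        if r["id"] in seen:
            continue  # deduplication: keep the first occurrence per id
        seen.add(r["id"])
        silver.append({"id": r["id"], "amount": r["amount"]})
    return silver

def to_gold(silver):
    """Gold layer: business-ready aggregate over the cleaned data."""
    return {"total_amount": sum(r["amount"] for r in silver)}

raw = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 10.0},   # duplicate, dropped in silver
    {"id": 2, "amount": None},   # fails validation, dropped in silver
    {"id": 3, "amount": 5.0},
]
gold = to_gold(to_silver(ingest_bronze(raw)))
```

In a real Databricks pipeline each layer would typically be a Delta table written by a Spark job, but the shape of the logic (land raw, then validate and deduplicate, then aggregate) is the same.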


Key Responsibilities:

  • Design and deploy scalable data pipelines in Azure using tools like Databricks, Spark, Delta Lake, DBT, Dagster, Airflow, and Parquet.
  • Build workflows to ingest data from various sources (e.g., SFTP, vendor APIs) into Azure Data Lake.
  • Develop and maintain data transformation layers (Bronze/Silver/Gold) within a medallion architecture.
  • Apply data quality checks, deduplication, and validation logic throughout the ingestion process.
  • Create reusable and parameterized notebooks for both batch and streaming data jobs.
  • Implement efficient merge/update logic in Delta Lake using partitioning strategies.
  • Work closely with business and application teams to gather data integration requirements and deliver solutions.
  • Support downstream integrations with APIs, Power BI dashboards, and SQL-based reports.
  • Set up monitoring, logging, and data lineage tracking using tools like Unity Catalog and Azure Monitor.
  • Participate in code reviews, design sessions, and agile backlog grooming.
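The merge/update responsibility above maps to Delta Lake's MERGE (upsert) semantics: matched rows are updated, unmatched rows are inserted, and filtering on a partition column keeps the operation from scanning the whole table. Below is a minimal plain-Python sketch of that behavior; the dict keyed on `(region, id)` stands in for a partitioned Delta table, and all names are hypothetical (a real job would use the `DeltaTable.merge` API):

```python
# Illustrative sketch of MERGE (upsert) semantics against a table
# partitioned on 'region'. A dict keyed by (region, id) stands in
# for the Delta table.

def merge_upsert(target, updates, key=("region", "id")):
    """Apply MERGE semantics: update matched rows, insert unmatched ones.

    target  -- dict mapping a key tuple to a row dict (the 'table')
    updates -- list of incoming row dicts
    key     -- key columns; leading with the partition column mirrors
               how a partition predicate narrows a real Delta MERGE
    """
    for row in updates:
        k = tuple(row[c] for c in key)
        if k in target:
            target[k].update(row)   # WHEN MATCHED THEN UPDATE
        else:
            target[k] = dict(row)   # WHEN NOT MATCHED THEN INSERT
    return target

table = {("us", 1): {"region": "us", "id": 1, "qty": 5}}
merge_upsert(table, [
    {"region": "us", "id": 1, "qty": 7},   # matches -> update in place
    {"region": "eu", "id": 2, "qty": 3},   # no match -> insert
])
```

The same two branches correspond to `whenMatchedUpdateAll()` and `whenNotMatchedInsertAll()` in the Delta Lake Python API.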

Additional Technical Duties:

  • SQL Server Development: Write and optimize stored procedures, functions, views, and indexing strategies for high-performance data processing.
  • ETL/ELT Processes: Manage data extraction, transformation, and loading using SSIS and SQL batch jobs.

Tech Stack:

  • Languages & Frameworks: Python, C#, .NET Core, SQL, T-SQL
  • Databases & ETL Tools: SQL Server, SSIS, SSRS, Power BI
  • API Development: ASP.NET Core Web API, RESTful APIs
  • Cloud & Data Services (Roadmap): Azure Data Factory, Azure Functions, Azure Databricks, Azure SQL Database, Azure Data Lake, Azure Storage
  • Streaming & Big Data (Roadmap): Delta Lake, Databricks, Kafka (preferred but not required)
  • Governance & Security: Data integrity, performance tuning, access control, compliance
  • Collaboration Tools: Jira, Confluence, Visio, Smartsheet